1940s
Antibiotics & isotopes

Introduction
As the American public danced to the beat of the Big Band era, pharmacology swung into action with the upbeat tone of the dawning antibiotic era.

The latter is the nickname most commonly used for the 1940s among scientists in the world of biotechnology and pharmaceuticals. The nickname is more than justified, given the numerous impressive molecules developed during the decade. Many “firsts” were accomplished in the drug discovery industry of the Forties, but no longer in a serendipitous fashion as before. Researchers were actually looking for drugs and finding them.

To appreciate the events that paved the way for the advancements of the 1940s, one need only look back to 1939, when René Dubos of the Rockefeller Institute for Medical Research discovered and isolated an antibacterial compound—tyrothricin, from the soil microbe Bacillus brevis—capable of destroying Gram-positive bacteria. Before this discovery, penicillin and the sulfa drugs had been discovered “accidentally.” Dubos, by contrast, planned his experiment as a deliberate search of soil for microorganisms that could destroy disease-causing organisms. “It was a planned experiment. It wasn’t a chance observation,” explains H. Boyd Woodruff, who worked on the penicillin project at Merck in the early 1940s, “[and because the experiment] had been successful, it sort of opened the field in terms of looking at soil for microorganisms that kill disease organisms.”

Fleming’s serendipity
The most famous example of serendipity in the 20th century has to be the discovery of penicillin by Alexander Fleming as discussed in the previous chapter. Although Fleming observed the antibiotic properties of the mold Penicillium notatum in 1928, it was another 12 years before the active ingredient, penicillin, was isolated and refined.

Of course, as in most of history, there are some inconsistencies. Fleming was not the first scientist to observe the antibacterial action of penicillin. In 1896 in Lyon, France, Ernest Augustin Duchesne studied the survival and growth of bacteria and molds, separately and together. He observed that the mold Penicillium glaucum had antibacterial properties against strains of both Escherichia coli and typhoid bacilli. The antibacterial properties of penicillin serendipitously surfaced at least three times during the course of scientific history before scientists used its power. And the third time was definitely a charm.

Finding a magic bullet
As a student of pathology at Oxford University, Howard Walter Florey developed early research interests in mucus secretion and lysozyme—an antibacterial enzyme originally discovered by Fleming. The more he learned about the antibacterial properties of lysozyme and intestinal mucus, the more interested he became in understanding the actual chemistry behind the enzymatic reactions. However, he did not have the opportunity to work with chemists until 1935, when Florey hired Ernst Boris Chain to set up a biochemistry section in the department of pathology at the Sir William Dunn School of Pathology at Oxford. Because Chain was a chemist, Florey encouraged him to study the molecular action of lysozyme. Florey wanted to find out whether lysozyme played a role in duodenal ulcers and was less interested in its antibacterial properties.

During a scientific literature search on bacteriolysis, Chain came upon Fleming’s published report of penicillin, which had, as Chain describes, “sunk into oblivion in the literature.” Chain thought that the active substance inducing staphylococcus lysis might be similar to lysozyme, and that their modes of action might also be similar. He set out to isolate penicillin to satisfy his own scientific curiosity and to answer a biological problem—what reaction lysozyme catalyzes—not to find a drug.

The scientific collaborations and discussions between Chain and Florey eventually laid the foundation for their 1939 funding application to study the antimicrobial products of microorganisms. It never crossed their minds that one of these antimicrobial products would be the next magic bullet. The timing of their funded research is also significant—it occurred within months of Great Britain’s declaration of war with Germany and the beginning of World War II.

Because of its activity against staphylococcus, Fleming’s penicillin was one of the first compounds chosen for the study. The first step toward isolating penicillin came in March 1940 at the suggestion of colleague Norman G. Heatley: the team extracted the acidified culture filtrate into organic solution and then re-extracted penicillin into a neutral aqueous solution. In May, Florey examined the chemotherapeutic effects of penicillin by treating four of eight mice infected with Streptococcus pyogenes. The four treated mice survived, whereas the other four died within 15 hours. In September, Henry Dawson and colleagues confirmed the antibiotic properties of penicillin by taking a bold step and injecting it into a patient at Columbia Presbyterian Hospital (New York).

With the help of Chain, Heatley, Edward P. Abraham, and other Dunn School chemists, Florey was able to scrape together enough penicillin to perform clinical trials at the Radcliffe Infirmary in Oxford in February 1941. The first patient treated was dying of a Staphylococcus aureus and Streptococcus pyogenes infection. Treatment with penicillin produced an amazing recovery, but because of insufficient quantities of the drug, the patient relapsed and eventually died.

Over the next three months, five other patients responded well when treated with penicillin. All of these patients were seriously ill with staphylococcal or streptococcal infections that could not be treated with sulfonamide. These trials proved the effectiveness of penicillin when compared to the sulfa drugs, which at the time were considered the gold standard for treating infections.

Producing penicillin
Florey had difficulties isolating the quantities of penicillin required to prove its value. In the early years, the Oxford team grew the mold by surface culture in anything they could lay their hands on. Because of the war, they couldn’t get the glass flasks they wanted, so they used bedpans until Florey convinced a manufacturer to make porcelain pots, which incidentally resembled bedpans. Britain was deep into the war, and the British pharmaceutical industry did not have the personnel, material, or funds to help Florey produce penicillin.

Florey and Heatley came to the United States in June 1941 to seek assistance from the American pharmaceutical industry. They traveled around the country but could not garner interest for the project. Because of the as yet ill-defined growing conditions for P. notatum and the instability of the active compound, the yield of penicillin was low and it was not economically feasible to produce. Florey and Heatley ended up working with the U.S. Department of Agriculture’s Northern Regional Research Laboratory in Peoria, IL.

The agricultural research center had excellent fermentation facilities, but more importantly—unlike any other facility in the country—it used corn steep liquor in the medium when faced with problematic cultures. This liquor yielded remarkable results for the penicillin culture. The production of penicillin increased by more than 10-fold, and the resulting penicillin was stable. It turns out that the penicillin (penicillin G) produced at the Peoria site was an entirely different compound from the penicillin (penicillin F) produced in Britain. Fortunately for all parties involved, penicillin G demonstrated the same antibacterial properties against infections as penicillin F. With these new developments, Merck, Pfizer, and Squibb agreed to collaborate on the development of penicillin.

By this time, the United States had entered the war, and the U.S. government was encouraging pharmaceutical companies to collaborate and successfully produce enough penicillin to treat war-related injuries. By 1943, several U.S. pharmaceutical companies were mass-producing purified penicillin G (~21 billion dosage units per month), and it became readily available to treat bacterial infections contracted by soldiers. In fact, by 1944, there was sufficient penicillin to treat all of the severe battle wounds incurred on D-day at Normandy. Also, diseases like syphilis and gonorrhea could suddenly be treated more easily than with earlier treatments, which included urethra cleaning and doses of noxious chemicals such as mercury or Salvarsan. The Americans continued to produce penicillin at a phenomenal rate, reaching nearly 7 trillion units per month in 1945. Fleming, Florey, and Chain were recognized “for the discovery of penicillin and its curative effect in various infectious diseases” in 1945 when they received the Nobel Prize in Physiology or Medicine.

But all magic bullets lose their luster, and penicillin was no different. Dubos had the foresight to understand the unfortunate potential of antibiotic-resistant bacteria and encouraged prudent use of antibiotics. As a result of this fear, Dubos stopped searching for naturally occurring compounds with antibacterial properties.
BIOGRAPHY: A.J.P. Martin, chromatographer extraordinaire
In 1941, Archer John Porter Martin and Richard Laurence Millington Synge, working for the Wool Industries Research Association in England, came up with liquid–liquid partition chromatography. It was arguably the most significant tool for linking analytical chemistry to the life sciences and helped create and define molecular biology research. Partition chromatography was developed to separate the various amino acids that made up proteins. In 1944, Martin, with his colleagues R. Consden and A. H. Gordon, developed paper chromatography—another of the most important methods used in the early days of biotechnology. This enabled the routine isolation and identification of nucleic acids and amino acids unattainable by column chromatography. Amazingly enough, in 1950, Martin, with yet another colleague, Anthony T. James, developed gas–liquid partition chromatography from an idea Martin and Synge had put forward in their 1941 paper. These technologies, along with a host of others begun in the 1940s, would make possible the rational drug discovery breakthroughs of the coming decades.
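The separation principle behind Martin and Synge’s work can be illustrated with a toy model. The sketch below (a hedged illustration: the Craig-style countercurrent scheme stands in for their actual apparatus, and all partition ratios and step counts are invented for the example) distributes two hypothetical solutes across a series of transfer steps. Because each solute carries a different fraction of its mass forward per step, the two emerge as peaks in different positions—the essence of partition chromatography.

```python
# Idealized countercurrent (Craig-style) model of partition chromatography:
# at each transfer, a fraction p = K/(1 + K) of each compound moves with
# the mobile phase to the next "tube," where K is its partition ratio.

def craig_distribution(partition_ratio, transfers=50):
    """Distribute one unit of a compound over tubes after n transfers."""
    p = partition_ratio / (1.0 + partition_ratio)  # fraction moved per step
    tubes = [0.0] * (transfers + 1)
    tubes[0] = 1.0  # all of the compound starts in the first tube
    for _ in range(transfers):
        # Compute the portion arriving in each tube from its predecessor,
        # reading only the pre-transfer amounts.
        moved = [0.0] * (transfers + 1)
        for i in range(transfers, 0, -1):
            moved[i] = tubes[i - 1] * p
        for i in range(transfers + 1):
            tubes[i] = tubes[i] * (1.0 - p) + moved[i]
    return tubes

# Two hypothetical solutes with different partition ratios separate:
a = craig_distribution(partition_ratio=1.0)   # peak near tube 25
b = craig_distribution(partition_ratio=3.0)   # peak near tube 38
peak_a = a.index(max(a))
peak_b = b.index(max(b))
print(peak_a, peak_b)
```

After n transfers the profile is binomial, so the peak sits near n·p; compounds with unequal partition ratios drift apart at a rate set purely by their phase-partitioning chemistry, which is why the method needed no knowledge of the molecules beyond their solubilities.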
As early as 1940, Abraham and Chain identified a strain of S. aureus that could not be treated with penicillin. This seemingly small, almost insignificant event foreshadowed the wave of antibiotic-resistant microorganisms that became such a problem throughout the medical field toward the end of the century.

Malaria and quinine
Although penicillin was valuable against the battle-wound infections and venereal diseases that have always afflicted soldiers, it was not effective against the malaria that was killing off the troops in the mosquito-ridden South Pacific. The Americans entered Guadalcanal in June 1942, and by August there were 900 cases of malaria; in September, there were 1724; and in October, 2630. By December 1942, more than 8500 U.S. soldiers were hospitalized with malaria. Ninety percent of the men had contracted the disease, and in one hospital, as many as 8 of every 10 soldiers had malaria rather than combat-related injuries.

The only available treatment, however, was the justifiably unpopular drug Atabrine. Besides tasting bitter, the yellow Atabrine pills caused headaches, nausea, vomiting, and in some cases, temporary psychosis. It also seemed to leave a sickly hue to the skin and was falsely rumored to cause impotence. Nevertheless, it was effective and saved lives. Firms such as Abbott, Lilly, Merck, and Frederick Stearns assured a steady supply of Atabrine, producing 3.5 billion tablets in 1944 alone.

But Atabrine lacked the efficacy of quinine, which is isolated from cinchona, an evergreen tree native to the mountains of South and Central America. Unfortunately, the United States did not have a sufficient supply of quinine in reserve when the war broke out. As a result, the U.S. government established the Cinchona Mission in 1942. Teams of botanists, foresters, and assistants went to South America to find and collect quinine-rich strains of the tree—a costly, strenuous, and time-consuming task.

Out of desperation, research to develop antimalarials intensified. As an unfortunate example of this desperation, prison doctors in the Chicago area experimentally infected nearly 400 inmates with malaria during their search for a therapeutic. Although aware that they were helping the war effort, the prisoners were not given sufficient information about the details and risks of the clinical experiments. After the war, Nazi doctors on trial for war crimes in Nuremberg referred to this incident as part of their defense for their criminal treatment of prisoners while aiding the German war effort.

In 1944, William E. Doering and Robert B. Woodward synthesized quinine—a complex molecular structure—from coal tar. Woodward’s achievements in the art of organic synthesis earned him the 1965 Nobel Prize in Chemistry. Chloroquine, another important antimalarial, was synthesized and studied under the name of Resochin by the German company Bayer in 1934 and rediscovered in the mid-1940s. Even though chloroquine-resistant parasites cause illness throughout the world, the drug is still the primary treatment for malaria. 
SOCIETY: Atoms for peace?
Although “Atoms for Peace” became a slogan in the 1950s, in the 1940s the case was just the opposite. By the start of the decade, Ernest Lawrence’s Radiation Laboratory and its medical cyclotron, which began operation in the 1930s to produce experimental therapeutics, were transformed to the wartime service of atom bomb research. Ultimately, the Manhattan Project yielded both the Hiroshima and Nagasaki bombs, but also what the Department of Energy refers to as “a new and secret world of human experimentation.” Tests on informed and uninformed volunteers, both civilian and military, of the effects of radiation and fallout flourished simultaneously in a world where, for the first time, sufficient quantities of a wide variety of radioactive materials became available for use in a dizzying array of scientific research and medical experimentation. Radioisotopes would become the tool for physiological research in the decades that followed.

Streptomycin and tuberculosis
When Dubos presented his results with tyrothricin at the Third International Congress for Microbiology in New York in 1939, Selman A. Waksman was there to see it. The successful development of penicillin and the discovery of tyrothricin made Waksman realize the enormous potential of soil as a source of druglike compounds. He immediately decided to focus on the medicinal uses of antibacterial soil microbes.

In 1940, Woodruff and Waksman isolated and purified actinomycin from Actinomyces griseus (later named Streptomyces griseus), which led to the discovery of many other antibiotics from that same group of microorganisms. Actinomycin attacks Gram-negative bacteria responsible for diseases like typhoid, dysentery, cholera, and undulant fever and was the first antibiotic purified from an actinomycete. Considered too toxic for the treatment of diseases in animals or humans, actinomycin is primarily used as an investigative tool in cell biology. In 1942, the two researchers isolated and purified streptothricin, which prevents the proliferation of Mycobacterium tuberculosis but is also too toxic for human use.

A couple of years later, in 1944, Waksman, with Albert Schatz and Elizabeth Bugie, isolated the first aminoglycoside, streptomycin, from S. griseus. Unlike penicillin, which blocks bacterial cell wall synthesis and acts mainly on Gram-positive organisms, aminoglycosides such as streptomycin inhibit bacterial protein synthesis and are effective against Gram-negative organisms. Waksman studied the value of streptomycin in treating bacterial infections, especially tuberculosis. In 1942, several hundred thousand deaths resulted from tuberculosis in Europe, and another 5 to 10 million people suffered from the disease. Although sulfa drugs and penicillin were readily available, they had no effect on the disease.

Merck immediately started manufacturing streptomycin with the help of Woodruff. Waksman, a consultant for Merck, had originally sent Woodruff there to help with the penicillin project, and after finishing his thesis, Woodruff continued working at the company. Simultaneously, studies by W. H. Feldman and H. C. Hinshaw at the Mayo Clinic confirmed streptomycin’s efficacy and relatively low toxicity against tuberculosis in guinea pigs. On November 20, 1944, doctors administered streptomycin for the first time to a seriously ill tuberculosis patient and observed a rapid, impressive recovery. No longer unconquerable, tuberculosis could be tamed and beaten into retreat. In 1952, Waksman was awarded the Nobel Prize in Physiology or Medicine for his discovery of streptomycin—1 of 18 antibiotics discovered under his guidance—and its therapeutic effects in patients suffering from tuberculosis.

Merck had just developed streptomycin and moved it into the marketplace when the company stumbled upon another great discovery. At the time, doctors treated patients with pernicious anemia by injecting them with liver extracts, which contained a factor required for curing and controlling the disease. When patients stopped receiving injections, the disease redeveloped. The Merck chemists had been working on isolating what was called the pernicious anemia factor from liver extracts, and they decided to look at the cultures grown by Woodruff and other microbiologists at Merck, to see if one of the cultures might produce the pernicious anemia factor. They found a strain of S. griseus similar to the streptomycin-producing strain that made the pernicious anemia factor.

With the help of Mary Shorb’s Lactobacillus lactis assay to guide the purification and crystallization of the factor, Merck scientists were able to manufacture and market it as a cure for pernicious anemia. The factor turned out to be a vitamin and was later named vitamin B12. As Woodruff describes the period, “So, we jumped from penicillin to streptomycin to vitamin B12. We got them 1-2-3, bang-bang-bang.” Merck struck gold three times in a row. Dorothy Crowfoot Hodgkin, the United Kingdom’s only female science Nobel laureate, solved the molecular structure of vitamin B12 in 1956—just as she had solved that of penicillin in the 1940s, a structure withheld from publication until World War II was over.
TECHNOLOGY: Transforming DNA
The 1940s was the decade that changed DNA from a mere chemical curiosity to the acknowledged home of the gene—a realization that would have a profound impact on the pharmaceutical industry by the end of the century. In the mid-1930s, Oswald T. Avery, after discovering a pneumonia-fighting enzyme with Dubos and being disheartened by its lack of utility compared with Prontosil, moved into the field of molecular biology. His greatest achievement came in 1944 with Colin MacLeod and Maclyn McCarty, when the team showed that DNA constitutes the genetic material in cells. Their research demonstrated that pancreatic deoxyribonuclease, but not pancreatic ribonuclease or proteolytic enzymes, destroyed the “transforming principle”—a factor that could move genetic information from one type of bacteria to another, and which turned out to be naked DNA. In a sense, this was the experiment that set the stage for Watson and Crick in the 1950s and the marvels of genetic engineering yet to come.

The continuing search
After developing penicillin, U.S. pharmaceutical companies continued to search for “antibiotics,” a term coined by P. Vuillemin in 1889 but later defined by Waksman in 1947 as those chemical substances “produced by microbes that inhibit the growth of and even destroy other microbes.”

In 1948, Benjamin M. Duggar, a professor at the University of Wisconsin and a consultant to Lederle, isolated chlortetracycline from Streptomyces aureofaciens. Chlortetracycline, also called aureomycin, was the first tetracycline antibiotic and the first broad-spectrum antibiotic. Active against an estimated 50 disease organisms, aureomycin works by inhibiting protein synthesis. The discovery of the tetracycline ring system also enabled further development of other important antibiotics.

Other antibiotics that inhibit cell wall synthesis were also discovered in the 1940s, including cephalosporin and bacitracin. Cephalosporin, another β-lactam antibiotic, was first isolated from Cephalosporium acremonium in 1948 by Giuseppe Brotzu at the University of Cagliari in Italy. Bacitracin, first derived from a strain of Bacillus subtilis, is active against Gram-positive bacteria and is used topically to treat skin infections.

Nonantibiotic therapeutics
Lederle Laboratories, which operated as a blood-processing plant during World War II, evolved into a manufacturer of vitamins and nutritional products, including folic acid. Sidney Farber, a cancer scientist at Boston’s Children’s Hospital, was testing the effects of folic acid on cancer. Some of his results, which now look dubious, suggested that folic acid worsened cancer, inspiring chemists at Lederle to make antimetabolites—structural mimics of essential metabolites that interfere with the biosynthetic reactions involving them—resembling folic acid to block its action. These events led to the 1948 development of methotrexate, one of the earliest anticancer agents and a mainstay of leukemia chemotherapy.

But the pioneer of designing and synthesizing antimetabolites that could destroy cancer cells was George Hitchings, head of the department of biochemistry at Burroughs Wellcome Co. In 1942, Hitchings initiated his DNA-based antimetabolite program, and in 1948, he and Gertrude Elion synthesized and demonstrated the anticancer activity of 2,6-diaminopurine. By fine-tuning the structure of the toxic compound, Elion synthesized 6-mercaptopurine, a successful therapeutic for treating acute leukemia. Hitchings, Elion, and Sir James W. Black won the Nobel Prize in Physiology or Medicine in 1988 for their discoveries of “important principles for drug treatment,” which constituted the groundwork for rational drug design.

The discovery of corticosteroids as therapeutics can be linked to Thomas Addison, who made the connection between the adrenal glands and the rare Addison’s disease in 1855. But it wasn’t until Edward Calvin Kendall at the Mayo Clinic and Tadeus Reichstein at the University of Basel independently isolated several hormones from the adrenal cortex that corticosteroids were used to treat more widespread maladies. In 1948, Kendall and Philip S. Hench demonstrated the successful treatment of patients with rheumatoid arthritis using cortisone. Kendall, Reichstein, and Hench received the 1950 Nobel Prize in Physiology or Medicine for determining the structure and biological effects of adrenal cortex hormones.

One of the first therapeutic drugs to prevent cardiovascular disease also came from this period. While investigating the mysterious deaths of farm cows, Karl Paul Link at the University of Wisconsin proved that the cattle’s loss of clotting ability was linked to their intake of spoiled sweet clover. In 1940, he and his colleagues isolated the responsible anticoagulant, dicoumarol—a derivative of coumarin, a substance found in sweet clover—which later gave rise to the widely used blood thinner warfarin.

Many other advances
The synthesis, isolation, and therapeutic application of miracle drugs may be the best-remembered discoveries of the 1940s for medicinal chemists and biochemists, but advances in experimental genetics, biology, and virology were also under way. They include the isolation of the influenza B virus in 1940 by Thomas Francis at New York University and, independently, by Thomas Pleines Magill. Also in 1940, at the New York Hospital–Cornell University Medical Center, Mary Loveless succeeded in blocking the generation of immunotherapy-induced antibodies using pollen extracts. Routine use of the electron microscope in virology followed the first photographs of tobacco mosaic virus, taken in 1939 by Helmut Ruska, an intern at the Charité Medical School of Berlin University; and the 1940s also saw numerous breakthroughs in immunology, including the first description of phagocytosis by a neutrophil.

In 1926, Hermann J. Muller, a professor at the University of Texas at Austin, reported the identification of several irradiation-induced genetic alterations, or mutations, in Drosophila that resulted in readily observed traits. This work, which earned Muller the Nobel Prize in Physiology or Medicine in 1946, enabled scientists to recognize mutations in genes as the cause of specific phenotypes, but it was still unclear how mutated genes led to the observed phenotypes.

In 1935, George Wells Beadle began studying the development of eye pigment in Drosophila with Boris Ephrussi at the Institut de Biologie Physico-Chimique in Paris. Beadle then collaborated with Edward Lawrie Tatum when they both joined Stanford in 1937—Beadle as a professor of biology (genetics) and Tatum as a research associate in the department of biological sciences. Tatum, who had a background in chemistry and biochemistry, handled the chemical aspects of the Drosophila eye-color study. Beadle and Tatum eventually switched to the fungus Neurospora crassa, a bread mold. After producing Neurospora mutants by irradiation and screening for interesting phenotypes, they found several auxotrophs—strains that grow normally on rich media but cannot grow on minimal medium. Each mutant required its own specific nutritional supplement, and each requirement correlated with the loss of a compound normally synthesized by the organism. By determining that each mutation caused a deficiency in a specific metabolic pathway—pathways known to be controlled by enzymes—Beadle and Tatum concluded in a 1941 report that each gene produced a single enzyme, the “one gene–one enzyme” concept. The two scientists shared the Nobel Prize in Physiology or Medicine in 1958 for discovering that genes regulate the function of enzymes and that each gene controls a specific enzyme.

Also recognized with the same prize in 1958 was Joshua Lederberg. As a graduate student in Tatum’s laboratory in 1946, Lederberg found that some plasmids enable bacteria to transfer genetic material to each other through direct cell–cell contact in a process called conjugation. He also showed that F (fertility) factors allowed conjugation to occur. In addition, Lederberg defined the concepts of generalized and specialized transduction, collaborated with other scientists to develop the selection theory of antibody formation, and demonstrated that penicillin-susceptible bacteria could be grown in the antibiotic’s presence if a hypertonic medium was used.

In the field of virology, John Franklin Enders, Thomas H. Weller, and Frederick Chapman Robbins at the Children’s Hospital Medical Center in Boston figured out in 1949 how to grow poliovirus in test-tube cultures of human tissues—a technique enabling the isolation and study of viruses. Polio, often referred to as infantile paralysis, was one of the most feared diseases of the era. These researchers received the Nobel Prize in Physiology or Medicine in 1954.

Salvador Luria, at Indiana University, and Alfred Day Hershey, at Washington University’s School of Medicine, demonstrated that the mutation of bacteriophages makes it difficult for a host to develop immunity against viruses. In 1942, Thomas Anderson and Luria photographed and characterized E. coli T2 bacteriophages using an electron microscope. Luria and Max Delbrück, at Vanderbilt University, used statistical methods to demonstrate that inheritance in bacteria follows Darwinian principles. Luria, Hershey, and Delbrück were awarded the Nobel Prize in Physiology or Medicine in 1969 for elucidating the replication mechanism and genetic structure of viruses.
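The statistical argument behind Luria and Delbrück’s work can be sketched with a small simulation (a hedged illustration: the population sizes, mutation rate, and culture count below are invented for the example, not their actual figures). If resistant mutants arise spontaneously during growth, parallel cultures show wildly fluctuating mutant counts—variance far above the mean, driven by rare early “jackpot” mutations inherited by many descendants—whereas mutants induced only on exposure to phage would be Poisson distributed, with variance close to the mean.

```python
import math
import random
import statistics

random.seed(1)  # make the sketch reproducible

def poisson(lam):
    """Knuth's Poisson sampler; adequate for the modest rates used here."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def grow_culture(generations=15, mu=1e-4):
    """One culture grown from a single cell under the spontaneous-mutation
    hypothesis: a mutation arising at a division is inherited by all
    descendants, so an early mutation yields a 'jackpot' of resistant cells."""
    sensitive, resistant = 1, 0
    for _ in range(generations):
        sensitive *= 2
        resistant *= 2
        new_mutants = poisson(sensitive * mu)  # mutations at this round of division
        sensitive -= new_mutants
        resistant += new_mutants
    return resistant

cultures = 200
spontaneous = [grow_culture() for _ in range(cultures)]
mean_s = statistics.mean(spontaneous)

# Rival hypothesis: resistance is induced only on exposure to the phage,
# so counts across cultures should be Poisson (variance roughly equal to mean).
induced = [poisson(mean_s) for _ in range(cultures)]

fano_spontaneous = statistics.variance(spontaneous) / mean_s
fano_induced = statistics.variance(induced) / statistics.mean(induced)
print(f"variance/mean, spontaneous: {fano_spontaneous:.1f}")
print(f"variance/mean, induced:     {fano_induced:.1f}")
```

The spontaneous-mutation model produces a variance/mean ratio far above 1, while the induced model stays near 1; Luria and Delbrück’s observation of the former pattern in real cultures is what showed that bacterial inheritance follows Darwinian principles.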

Although these discoveries were made outside of the pharmaceutical industry, their applications contributed enormously to understanding the mechanisms of diseases and therapeutic drugs.

Biological and chemical warfare
Biological warfare—the use of disease to harm or kill an adversary’s military forces, population, food, and livestock—can involve any living microorganism, nonliving virus, or bioactive substance deliverable by conventional artillery. The history of biological warfare can be traced back to the Romans, who used dead animals to infect their enemies’ water supply. Japan sprayed bubonic plague over parts of mainland China on five occasions in 1941; although the spraying was ineffective, the attempts prompted the United States to start its own biological warfare program in 1942. After the war, the United States obtained Japanese data about the destructive use of chemical and biological agents from pardoned war criminals, and the developing Cold War further stimulated this research in the United States—and in the Soviet Union.

Ironically, the first chemotherapeutic agent for cancer came from an early instance of chemical warfare. Initially used as a weapon in World War I, mustard gas proved useful as a therapeutic in 1942, when Alfred Gilman and Fred Phillips experimentally administered it first to mice and then to a person with lymphoma. Because the patient showed some improvement, chemical derivatives of mustard gas were developed and used to treat various cancers.

The Nuremberg Code
Not only did World War II encourage the discovery and development of antibiotics and antidisease drugs, it also instigated the need to define what constitutes permissible medical experiments on human subjects. The Nazis performed cruel and criminal “medical” experiments on Jews and other prisoners during the war. In 1949, the Nuremberg Code was established in an effort to prevent medical crimes against humanity. The Code requires that individuals enrolled in clinical trials give voluntary consent. The experiment must be expected to yield results useful to society, be performed by scientifically qualified persons, and be grounded in animal experiments suggesting that the anticipated outcome will justify human trials. The Code also emphasizes that all unnecessary physical and mental suffering must be avoided and that precautions must be taken to protect the human subject if injury or disability results from the experiment. In achieving its goals, the Nuremberg Code empowers the human subject and holds the researcher responsible for inflicting unnecessary pain and suffering.
On the practical level, it was not until the 1960s that institutionalized protections for subjects in clinical trials and human experimentation were put into place.

Suggested reading
  • Chain, E. “The Early Years of the Penicillin Discovery.” TIPS 1979, 6–11.
  • Cowdrey, A. E. Fighting for Life: American Military Medicine in World War II (The Free Press: New York, 1994).
  • Hobby, G. Penicillin: Meeting the Challenge (Yale University Press: New Haven, CT, 1985).
  • Lederberg, J. S., Ed. Biological Weapons: Limiting the Threat (MIT Press: Cambridge, MA, 1999).
  • Waksman, S. A. The Antibiotic Era (The Waksman Foundation of Japan: Tokyo, 1975).
  • Woodruff, H. B., Ed. Scientific Contributions of Selman A. Waksman: Selected Articles Published in Honor of His 80th Birthday, July 22, 1968 (Rutgers University Press: New Brunswick, NJ, 1968).

“Swing” time
The 1940s ended with the antibiotic era in full swing and with a host of wartime advancements in fermentation and purification technologies changing the drug development process. Penicillin and DDT became the chemical markers of the age, promising to heal the world—curing the plagues and killing the plague carriers. The radioisotopes now easily produced through advances in technology promoted by the war were becoming routinely available for health research, as the era of computer-aided drug analysis began. The baby boom launched by postwar U.S. prosperity produced the first generation born with the expectation of health through drugs and medical intervention.

Because of these new possibilities, health became a political as well as a social issue. The leading role science played in the Allied victory gave way in the postwar 1940s to its new role as medical savior. The new technologies that became available in the 1940s—including partition chromatography, infrared and mass spectrometry, as well as nuclear magnetic resonance (NMR)—would eventually become critical to pharmaceutical progress.


© 2000 American Chemical Society